Fast Inference and Learning with Sparse Belief Propagation

Author

  • Chris Pal
Abstract

Even in trees, exact probabilistic inference can be expensive when the cardinality of the variables is large. This is especially troublesome for learning, because many standard estimation techniques, such as EM and conditional maximum likelihood, require calling an inference algorithm many times. In max-product inference, a standard heuristic for controlling this complexity in linear chains is beam search, that is, to ignore variable configurations during inference once their estimated probability becomes sufficiently low. Although quite effective for max-product, during sum-product inference beam search discards probability mass in a way that makes learning unstable. In this paper, we introduce a variational perspective on beam search that uses an approximating mixture of Kronecker delta functions. This motivates a novel variational approximation for arbitrary tree-structured models, which maintains an adaptively sized sparse belief state—thus extending beam search from max-product to sum-product, and from linear chains to arbitrary trees. We report efficiency improvements for max-product inference over other beam search techniques. Also, unlike heuristic methods for discarding probability mass, our method can be used effectively for conditional maximum likelihood training. On both synthetic and real-world problems, we report four-fold increases in learning speed with no loss in accuracy.
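The adaptively sized sparse belief state described in the abstract can be illustrated with a simple beam-style truncation rule: keep the smallest set of states whose total probability mass reaches a threshold, discard the rest, and renormalize. This is a minimal sketch for intuition only; the `epsilon` mass threshold and the function below are illustrative assumptions, not the paper's variational derivation.

```python
import numpy as np

def sparsify_belief(belief, epsilon=0.01):
    """Beam-style sparsification of a discrete belief vector (a sketch):
    retain the fewest states covering at least 1 - epsilon of the mass,
    zero out the remainder, and renormalize. The beam width adapts to
    how peaked the belief is, rather than being fixed in advance."""
    order = np.argsort(belief)[::-1]           # states by decreasing probability
    cumulative = np.cumsum(belief[order])
    # first index where cumulative mass reaches 1 - epsilon
    k = int(np.searchsorted(cumulative, 1.0 - epsilon)) + 1
    sparse = np.zeros_like(belief)
    kept = order[:k]
    sparse[kept] = belief[kept]
    return sparse / sparse.sum()               # renormalized sparse belief

belief = np.array([0.6, 0.25, 0.1, 0.04, 0.01])
print(sparsify_belief(belief, epsilon=0.05))   # keeps the top 3 states
```

A peaked belief is truncated to very few states, while a flat belief keeps many, which is the adaptive behavior that distinguishes this scheme from a fixed-width beam.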


Related articles

Sparse Message Passing and Efficiently Learning Random Fields for Stereo Vision

Message passing algorithms based on variational methods and belief propagation are widely used for approximate inference in a variety of directed and undirected graphical models. However, inference can become extremely slow when the cardinality of the state space of individual variables is high. In this paper we explore sparse message passing to dramatically accelerate approximate inference. We...

Full text

Efficiently Learning Random Fields for Stereo Vision with Sparse Message Passing

As richer models for stereo vision are constructed, there is a growing interest in learning model parameters. To estimate parameters in Markov Random Field (MRF) based stereo formulations, one usually needs to perform approximate probabilistic inference. Message passing algorithms based on variational methods and belief propagation are widely used for approximate inference in MRFs. Conditional ...

Full text

Minimizing Sparse High-Order Energies by Submodular Vertex-Cover

Inference in high-order graphical models has become important in recent years. Several approaches are based, for example, on generalized message-passing, or on transformation to a pairwise model with extra ‘auxiliary’ variables. We focus on a special case where a much more efficient transformation is possible. Instead of adding variables, we transform the original problem into a comparatively s...

Full text

Dynamic quantization for belief propagation in sparse spaces

Graphical models provide an attractive framework for modeling a variety of problems in computer vision. The advent of powerful inference techniques such as belief propagation (BP) has recently made inference with many of these models tractable. Even so, the enormous size of the state spaces required for some applications can create a heavy computational burden. Pruning is a standard technique f...

Full text

Continuous Graphical Models for Static and Dynamic Distributions: Application to Structural Biology

Generative models of protein structure enable researchers to predict the behavior of proteins under different conditions. Continuous graphical models are powerful and efficient tools for modeling static and dynamic distributions, which can be used for learning generative models of molecular dynamics. In this thesis, we develop new and improved continuous graphical models, to be used in modeling...

Full text


Journal title:

Volume   Issue 

Pages  -

Publication date 2005